
    A simpler puzzle of ground

    Metaphysical grounding is standardly taken to be irreflexive: nothing grounds itself. Kit Fine has presented some puzzles that appear to contradict this principle. I construct a particularly simple variant of those puzzles that is independent of several of the assumptions required by Fine, instead employing quantification into sentence position. Various possible responses to Fine's puzzles thus turn out to apply only in a restricted range of cases.

    Towards a theory of ground-theoretic content

    A lot of research has recently been done on the topic of ground, and in particular on the logic of ground. According to a broad consensus in that debate, ground is hyperintensional in the sense that even logically equivalent truths may differ with respect to what grounds them, and what they ground. This makes pressing the question of what we may take to be the ground-theoretic content of a true statement, i.e. that aspect of the statement's overall content to which ground is sensitive. I propose a novel answer to this question, namely that ground tracks how, rather than just by what, a statement is made true. I develop that answer in the form of a formal theory of ground-theoretic content and show how the resulting framework may be used to articulate plausible theories of ground, including in particular a popular account of the grounds of truth-functionally complex truths that has proved difficult to accommodate on alternative views of content.

    Semantic values in higher-order semantics

    Recently, some philosophers have argued that we should take quantification of any (finite) order to be a legitimate and irreducible, sui generis kind of quantification. In particular, they hold that a semantic theory for higher-order quantification must itself be couched in higher-order terms. Øystein Linnebo has criticized such views on the grounds that they are committed to general claims about the semantic values of expressions that are by their own lights inexpressible. I show that Linnebo's objection rests on the assumption of a notion of semantic value or contribution which both applies to expressions of any order, and picks out, for each expression, an extra-linguistic correlate of that expression. I go on to argue that higher-orderists can plausibly reject this assumption, by means of a hierarchy of notions they can use to describe the extra-linguistic correlates of expressions of different orders.

    Everything, and then some

    On its intended interpretation, logical, mathematical and metaphysical discourse sometimes seems to involve absolutely unrestricted quantification. Yet our standard semantic theories do not allow for interpretations of a language as expressing absolute generality. A prominent strategy for defending absolute generality, influentially proposed by Timothy Williamson in his paper ‘Everything’ (2003), avails itself of a hierarchy of quantifiers of ever increasing orders to develop non-standard semantic theories that do provide for such interpretations. However, as emphasized by Øystein Linnebo and Agustín Rayo (2012), there is pressure on this view to extend the quantificational hierarchy beyond the finite levels, and, relatedly, to allow for a cumulative conception of the hierarchy. In his recent book, Modal Logic as Metaphysics (2013), Williamson yields to that pressure. I show that the emerging cumulative higher-orderist theory has implications of a strongly generality-relativist flavour, and consequently undermines much of the spirit of the generality absolutism that Williamson set out to defend.

    A hyperintensional criterion of irrelevance

    On one important notion of irrelevance, evidence that is irrelevant in an inquiry may rationally be discarded, and attempts to obtain evidence amount to a waste of resources if they are directed at irrelevant evidence. The familiar Bayesian criterion of irrelevance, whatever its merits, is not adequate with respect to this notion. I show that a modification of the criterion due to Ken Gemes, though a significant improvement, still has highly implausible consequences. To make progress, I argue, we need to adopt a hyperintensional conception of content. I go on to formulate a better, hyperintensional criterion of irrelevance, drawing heavily on the framework of the truthmaker conception of propositions as recently developed by Kit Fine.

    Difference-making grounds

    We define a notion of difference-making for partial grounds of a fact in rough analogy to existing notions of difference-making for causes of an event. Using orthodox assumptions about ground, we show that it induces a non-trivial division with examples of partial grounds on both sides. We then demonstrate the theoretical fruitfulness of the notion by applying it to the analysis of a certain kind of putative counter-example to the transitivity of ground recently described by Jonathan Schaffer. First, we show that our conceptual apparatus of difference-making enables us to give a much clearer description than Schaffer does of what makes the relevant instances of transitivity appear problematic. Second, we suggest that difference-making is best seen as a mark of good grounding-based explanations rather than a necessary condition on grounding, and argue that this enables us to deal with the counter-example in a satisfactory way. Along the way, we show that Schaffer's own proposal for salvaging a form of transitivity by moving to a contrastive conception of ground is unsuccessful. We conclude by sketching some natural strategies for extending our proposal to a more comprehensive account of grounding-based explanations.

    Evaluation of the online-based self-help programme “Selfapy” in patients with unipolar depression: study protocol for a randomized, blinded parallel group dismantling study

    Background: Patients with mild to moderate depressive symptoms can have limited access to regular treatment; to ensure appropriate care, low-threshold treatment is needed. Effective online interventions could increase the supply of low-threshold treatment. Further research is needed to evaluate the effectiveness of online interventions. This study aims to evaluate the online-based self-help programme "Selfapy" on a sample of depressive subjects and compares the impact of the programme's unaccompanied version with that of its therapist-accompanied version. Methods: A sample of 400 subjects with a mild to severe depressive episode (Beck Depression Inventory-II and Hamilton Depression Scale) will be used. Subjects are randomly assigned to immediate access to an unaccompanied course (no support from a psychologist via weekly phone calls), immediate access to an accompanied course (support from a psychologist via weekly phone calls) or a waiting-list control group (access to the intervention after 24 weeks). The intervention will last for a period of 12 weeks. Depressive symptoms as the primary parameter, as well as various secondary parameters such as life satisfaction, therapeutic relationship, social activation, self-esteem, attitudes towards Internet interventions and drop-out rates, are recorded at four different points in time: at baseline (T1), 6 weeks after the start of the intervention (T2), 12 weeks after the start of the intervention (T3) and at the 3-month follow-up after completion of the treatment (T4). Conclusion: This randomized, controlled, blinded study will make use of a "dismantling" approach to adequately compare the accompanied and unaccompanied versions of the intervention. Positive and meaningful results are expected that could influence the acceptance and implementation of online interventions. Trial registration: German Clinical Trials Register DRKS00017191. Registered on 14 June 201

    Teaching old indicators even more tricks: binding affinity measurements with the guest-displacement assay (GDA)

    A simple change has important consequences: the guest-displacement assay (GDA) is introduced, which allows for binding-affinity determinations of supramolecular complexes with spectroscopically silent hosts and guests. The GDA is complementary to the indicator-displacement assay for affinity measurements with soluble components, but is superior for insoluble or weakly binding guests.
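    Displacement assays of this kind rest on a 1:1 competitive binding equilibrium: a reporter guest of known affinity and the guest under study compete for the host, and the unknown binding constant is extracted from how the observable complex concentration shifts during a titration. The sketch below shows only that generic underlying model, not the GDA protocol itself; the function name and all numbers are hypothetical.

    ```python
    from scipy.optimize import brentq

    def competitive_binding(H0, G1_0, G2_0, Ka1, Ka2):
        """[HG1] for the 1:1 competitive equilibria
           H + G1 <-> HG1 (Ka1) and H + G2 <-> HG2 (Ka2).

        From the guest mass balances, [HGi] = Kai*[H]*Gi_0 / (1 + Kai*[H]),
        so the free-host concentration [H] follows from the host mass
        balance, which is solved numerically below.
        """
        def host_balance(H):
            HG1 = Ka1 * H * G1_0 / (1 + Ka1 * H)
            HG2 = Ka2 * H * G2_0 / (1 + Ka2 * H)
            return H + HG1 + HG2 - H0

        H = brentq(host_balance, 0.0, H0)  # root is bracketed in [0, H0]
        return Ka1 * H * G1_0 / (1 + Ka1 * H)

    # Toy numbers (hypothetical): 10 uM host and reporter guest (Ka = 1e5 /M)
    # titrated with 20 uM analyte guest (Ka = 1e6 /M); concentrations in M.
    print(competitive_binding(1e-5, 1e-5, 2e-5, 1e5, 1e6))
    ```

    Determining the unknown Ka is then a matter of fitting the predicted complex concentration (or the signal derived from it) against a titration series.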

    Influence of uncertainties in material parameters on finite element simulations

    One fundamental part of engineering mechanics is the simulation of physical bodies, on which critical decisions in the manufacturing process are based. The main goal is to produce light and robust components, e.g. for the automotive and aircraft industries, that are easy to manufacture in an efficient, low-cost production process. At the same time, the produced parts should always fulfil the highest safety, stability and durability requirements. The necessary complex simulations are, however, subject to errors that arise from many different sources. First of all, the physical body needs to be described geometrically in the form of a discretized mesh, which itself introduces approximations and, with them, uncertainties. Furthermore, the body has to be described by a mathematical model that characterizes its physical behaviour. In the context of the established finite element method, these constitutive models capture the underlying material only phenomenologically, based on observations of its behaviour, without claiming a completely correct description of the material. They are calibrated by means of material parameters that need to be measured and thus carry additional uncertainties. The solution of such constitutive models is also prone to errors, because a direct analytical solution of these complex mathematical models is typically impossible to obtain; only approximate solutions in the time domain can be computed, which, in combination with the listed uncertainties, lead to reasonable but error-prone results. Although concepts for quantifying uncertainties were developed half a century ago and are quite common in other areas such as physics or mathematics, engineering mechanics typically operates by means of safety factors and bounds instead of incorporating such quantitative uncertainty measures into the standard simulation approaches. The finite element method has proven to be a reliable tool for simulating problems of industrial scale and interest. In this work, the well-established sensitivity analysis is applied to finite element simulations in order to obtain a quantitative view of the errors resulting from uncertainties in material parameters. For this purpose, the underlying structure of the equations relevant to these simulations is analysed in order to find ways of incorporating the uncertainty analysis into finite element simulations in a numerically robust manner. Using the example of the constitutive models of hyperelasticity and viscoplasticity, the overall procedure is investigated: from the experimental results used to derive the model parameters to the simulation of three-dimensional structures, it is explored how all parts of this process are subject to uncertainties and how this influences the final outcome. Applying these concepts together with other sources of uncertainty, one is able to investigate critical stresses and assess the certainty of these numerical predictions.
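    As a generic illustration of the propagation step such an analysis performs, the following is a minimal sketch under the assumption of first-order Gaussian error propagation: parameter uncertainties map to an output uncertainty through the sensitivities du/dp_i. The function and the toy model are hypothetical, and a real FE code would supply the analytical sensitivities discussed above rather than finite differences.

    ```python
    import numpy as np

    def propagate_uncertainty(u, params, sigmas, eps=1e-6):
        """First-order (delta-method) propagation of parameter
        uncertainties to a scalar simulation output.

        u      : callable mapping a parameter vector to the output
                 quantity of interest (e.g. a critical stress)
        params : nominal material parameters (1-D array)
        sigmas : standard deviations of those parameters (1-D array)
        """
        params = np.asarray(params, dtype=float)
        u0 = u(params)
        # Finite-difference sensitivities du/dp_i at the nominal point.
        sens = np.empty_like(params)
        for i in range(params.size):
            p = params.copy()
            h = eps * max(abs(p[i]), 1.0)
            p[i] += h
            sens[i] = (u(p) - u0) / h
        # Gaussian error propagation: sigma_u^2 = sum_i (du/dp_i)^2 * sigma_i^2
        sigma_u = np.sqrt(np.sum((sens * np.asarray(sigmas)) ** 2))
        return u0, sigma_u

    # Toy usage: linear-elastic bar stress = E * strain with an uncertain
    # Young's modulus E (all numbers illustrative only).
    stress = lambda p: p[0] * 0.002  # p = [E], strain = 0.2 %
    print(propagate_uncertainty(stress, [210e9], [5e9]))
    ```

    The same pattern applies unchanged when u is a full finite element solve returning a critical stress rather than a closed-form expression.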

    First Results of the PixelGEM Central Tracking System for COMPASS

    For its physics program with a high-intensity hadron beam of up to 2e7 particles/s, the COMPASS experiment at CERN requires tracking of charged particles scattered at very small angles with respect to the incident beam direction. While good resolution in time and space is mandatory, the challenge is imposed by the high beam intensity, which requires radiation-hard detectors that add very little material to the beam path in order to minimize secondary interactions. To this end, a set of triple-GEM detectors with a hybrid readout structure, consisting of pixels in the beam region and 2-D strips in the periphery, was designed and built. Successful prototype tests proved the performance of this new detector type, showing both extraordinarily high rate capability and detection efficiency. The amplitude information made it possible to achieve spatial resolutions about a factor of 10 smaller than the pitch, and a time resolution close to the theoretical limit imposed by the layout. The PixelGEM central tracking system, consisting of five detectors slightly improved with respect to the prototype, was completely installed in the COMPASS spectrometer in spring 2008.
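    The sub-pitch spatial resolution quoted above comes from exploiting the amplitude information; the textbook way to do this is a charge-weighted centre-of-gravity over the strips or pixels of a hit cluster. Below is a minimal sketch of that idea, illustrative only and not taken from the COMPASS reconstruction code.

    ```python
    import numpy as np

    def cluster_centroid(strip_indices, amplitudes, pitch):
        """Charge-weighted centre-of-gravity of a hit cluster.

        Using the measured amplitudes as weights yields a position
        estimate well below the readout pitch, which is how amplitude
        information improves on purely binary (hit/no-hit) readout.
        """
        idx = np.asarray(strip_indices, dtype=float)
        amp = np.asarray(amplitudes, dtype=float)
        return pitch * np.sum(idx * amp) / np.sum(amp)

    # Toy cluster spread over three strips of 0.4 mm pitch (numbers invented).
    print(cluster_centroid([10, 11, 12], [120.0, 540.0, 260.0], 0.4))  # in mm
    ```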
